# Code-Optimized Tokenizer

## Multilingual ModernBert Large Preview

- **License:** MIT
- **Category:** Large Language Model
- **Author:** makiart

A large multilingual ModernBERT model developed by the Algomatic team. It supports a context length of 8,192 tokens, was trained on approximately 60 billion tokens, and is suited to mask-filling tasks.
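Since the card describes the model as targeting mask-filling tasks, here is a minimal sketch of how such a model could be queried through the Hugging Face `transformers` fill-mask pipeline. The repository ID below is an assumption for illustration; the actual Hub path for Multilingual ModernBert Large Preview may differ.

```python
# Minimal sketch: mask filling with a ModernBERT-style model via the
# Hugging Face transformers fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="makiart/multilingual-modernbert-large-preview",  # hypothetical Hub ID
)

# BERT-style models typically use "[MASK]" as the mask token; check
# fill_mask.tokenizer.mask_token if unsure.
predictions = fill_mask("The capital of France is [MASK].")
for p in predictions:
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

Each prediction is a dict containing the filled token, its score, and the completed sequence, so the top candidates can be ranked directly from the returned list.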